Beyond Convex? Global Optimization is Feasible Only for Convex Objective Functions: A Theorem
Authors
Abstract
It is known that there are feasible algorithms for minimizing convex functions, and that for general functions, global minimization is a difficult (NP-hard) problem. It is therefore reasonable to ask whether there exists a class of functions, larger than the class of all convex functions, for which the corresponding minimization problems can still be solved feasibly. In this paper, we prove, in essence, that no such more general class exists. In other words, we prove that global optimization is guaranteed to be feasible only for convex objective functions.
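To make the contrast in the abstract concrete, here is a minimal, self-contained sketch (an assumed illustration, not an algorithm from the paper): plain gradient descent on a strictly convex quadratic provably reaches the global minimum, which is the kind of feasibility the theorem says cannot be extended beyond convex objectives.

```python
import numpy as np

# Assumed example objective: a strictly convex quadratic
#     f(x) = 0.5 * x^T A x - b^T x,  with A positive definite.
# For such convex objectives, gradient descent converges to the
# unique global minimizer x* = A^{-1} b.

def f(x, A, b):
    return 0.5 * x @ A @ x - b @ x

def grad_f(x, A, b):
    return A @ x - b

def gradient_descent(A, b, x0, step, iters=500):
    x = x0.copy()
    for _ in range(iters):
        x = x - step * grad_f(x, A, b)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    A = M @ M.T + np.eye(5)            # positive definite => strictly convex f
    b = rng.standard_normal(5)

    x_hat = gradient_descent(A, b, np.zeros(5), step=1.0 / np.linalg.norm(A, 2))
    x_star = np.linalg.solve(A, b)     # exact global minimizer of the quadratic

    print("objective gap:", f(x_hat, A, b) - f(x_star, A, b))
    print("close to global minimizer:", np.allclose(x_hat, x_star, atol=1e-6))
```

For non-convex objectives no such general guarantee is available, which is exactly the gap the theorem above formalizes.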
Similar Resources
Egoroff Theorem for Operator-Valued Measures in Locally Convex Cones
In this paper, we define the almost uniform convergence and the almost everywhere convergence for cone-valued functions with respect to an operator-valued measure. We prove the Egoroff theorem for P-valued functions and an operator-valued measure θ : R → L(P, Q), where R is a σ-ring of subsets of X ≠ ∅, (P, V) is a quasi-full locally convex cone and (Q, W) is a locally ...
Foundations of Gauge and Perspective Duality
Common numerical methods for constrained convex optimization are predicated on efficiently computing nearest points to the feasible region. The presence of a design matrix in the constraints yields feasible regions with more complex geometries. When the functional components are gauges, there is an equivalent optimization problem—the gauge dual—where the matrix appears only in the objective fun...
Functionally closed sets and functionally convex sets in real Banach spaces
Let $X$ be a real normed space. Then $C (\subseteq X)$ is functionally convex (briefly, $F$-convex) if $T(C) \subseteq \mathbb{R}$ is convex for all bounded linear transformations $T \in B(X, \mathbb{R})$; and $K (\subseteq X)$ is functionally closed (briefly, $F$-closed) if $T(K) \subseteq \mathbb{R}$ is closed for all bounded linear transformations $T \in B(X, \mathbb{R})$. We improve the Krein-Milman theorem ...
On the Extensions of the Frank-Wolfe Theorem
In this paper we consider optimization problems defined by a quadratic objective function and a finite number of quadratic inequality constraints. Given that the objective function is bounded over the feasible set, we present a comprehensive study of the conditions under which the optimal solution set is nonempty, thus extending the so-called Frank-Wolfe theorem. In particular, we first prove a gene...
Beyond Convex Optimization: Star-Convex Functions
We introduce a polynomial time algorithm for optimizing the class of star-convex functions, under no Lipschitz or other smoothness assumptions whatsoever, and no restrictions except exponential boundedness on a region about the origin, and Lebesgue measurability. The algorithm’s performance is polynomial in the requested number of digits of accuracy and the dimension of the search domain. This ...
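As a small illustration of the star-convexity notion referenced above (the excerpt does not describe the algorithm itself, and the function used here is an assumed example), the sketch below numerically checks the defining inequality f(t·c + (1−t)·x) ≤ t·f(c) + (1−t)·f(x) with center c = 0 for f(x, y) = |x·y|, which is star-convex about the origin but not convex.

```python
import numpy as np

# Assumed example: f(x, y) = |x * y| is star-convex with respect to the
# origin (since f(0) = 0 and f((1 - t) * p) = (1 - t)^2 * f(p) <= (1 - t) * f(p)),
# yet it is not convex. We spot-check the star-convexity inequality on
# random points and random interpolation weights t in [0, 1].

def f(p):
    return abs(p[0] * p[1])

rng = np.random.default_rng(1)
center = np.zeros(2)
ok = True
for _ in range(10_000):
    x = rng.uniform(-5.0, 5.0, size=2)
    t = rng.uniform(0.0, 1.0)
    lhs = f(t * center + (1 - t) * x)
    rhs = t * f(center) + (1 - t) * f(x)
    ok = ok and lhs <= rhs + 1e-12

print("star-convexity inequality held on all samples:", ok)
```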
Journal: J. Global Optimization
Volume: 33, Issue: -
Pages: -
Publication date: 2005